Now that we understand how pivot selection dictates the balance of the partitions, we can formally analyze the time complexity $T(n)$ of Quick Sort through the resulting recurrence relation. The overall efficiency hinges on how evenly each recursive call splits the array $A$ of size $n$.
- The complexity is determined by two factors: the number of levels in the recursion tree (the depth) and the total partitioning work done across all recursive calls at each level (a runnable sketch of both pieces follows this list).
- Best and Average Case: When the pivot consistently provides a balanced split (near $n/2$), the recursion depth is proportional to $\log n$. In fact, any split in constant proportion (even $9$-to-$1$) still gives $O(\log n)$ depth, which is why the average case over random pivots matches the best case asymptotically. Since the partitioning work across any given level sums to $O(n)$, the total complexity is $O(n \log n)$. This is why Quick Sort is typically among the fastest in-place comparison sorts in practice.
- Worst Case: If a poor pivot strategy is used (e.g., always choosing the first element of an already sorted array, so the pivot is the minimum), the split is maximally unbalanced: partitions of sizes $0$ and $n-1$. The recursion depth grows to $O(n)$, and because the subproblem at depth $i$ still does $O(n-i)$ partitioning work, the total cost accumulates to $O(n^2)$.
- Although Merge Sort guarantees $O(n \log n)$ in every case, Quick Sort's worst case is rare in practice (and can be made vanishingly unlikely with randomized pivot selection), so Quick Sort remains preferred for its excellent cache performance and low constant factors.
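To make the partitioning work concrete, here is a minimal Python sketch of Quick Sort. The Lomuto-style partition with the last element as pivot is an illustrative assumption; the analysis above holds for any partition scheme that does linear work per call.

```python
def partition(arr, lo, hi):
    """Lomuto partition (an illustrative choice): place arr[hi], the
    pivot, into its final position and return that index.
    Does O(hi - lo) work -- the O(n) term in the recurrences below."""
    pivot = arr[hi]
    i = lo  # boundary of the "< pivot" region
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]
    return i


def quicksort(arr, lo=0, hi=None):
    """Sort arr[lo..hi] in place. Depth is O(log n) when partition
    returns a near-middle index, O(n) when it returns an endpoint."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quicksort(arr, lo, p - 1)   # first partition, size k
        quicksort(arr, p + 1, hi)   # second partition, size n - 1 - k


data = [9, 3, 7, 1, 8, 2, 5]
quicksort(data)
print(data)  # [1, 2, 3, 5, 7, 8, 9]
```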
Mathematical Analysis: Recurrence Relations
Worst Case Recurrence Relation (Unbalanced Split)
$$T(n) = T(n-1) + T(0) + O(n)$$
$$T(n) = T(n-1) + O(n) \implies O(n^2)$$
Best/Average Case Recurrence Relation (Balanced Split)
$$T(n) = 2T(n/2) + O(n) \implies O(n \log n)$$
General Case Recurrence Relation (Splits $k$ and $n-1-k$)
$$T(n) = T(k) + T(n-1-k) + O(n)$$
Assumptions: The $O(n)$ term represents the cost of the partitioning step itself, and $k$ is the size of the first partition (the pivot occupies the remaining position, so the second partition has size $n-1-k$).
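To see where the bounds come from, the worst-case recurrence can be unrolled directly (writing the $O(n)$ partitioning cost as $cn$ and taking $T(0) = O(1)$):

$$T(n) = T(n-1) + cn = T(n-2) + c(n-1) + cn = \cdots = T(0) + c\sum_{i=1}^{n} i = O(n^2)$$

The balanced recurrence $T(n) = 2T(n/2) + cn$ yields $\Theta(n \log n)$ by the Master Theorem (case 2), or equivalently by noting there are $O(\log n)$ levels of $cn$ work each.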
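As a quick empirical sanity check (a self-contained sketch, deliberately using a first-element pivot so that sorted input triggers the worst case described above), counting element comparisons shows the quadratic blow-up directly:

```python
import random
import sys


def quicksort_count(arr, lo, hi, stats):
    """Quick Sort that tallies element comparisons, using the first
    element as the pivot -- the poor strategy described above."""
    if lo >= hi:
        return
    pivot = arr[lo]
    i = lo + 1  # boundary of the "< pivot" region
    for j in range(lo + 1, hi + 1):
        stats["comparisons"] += 1
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[lo], arr[i - 1] = arr[i - 1], arr[lo]  # pivot to final spot
    quicksort_count(arr, lo, i - 2, stats)
    quicksort_count(arr, i, hi, stats)


sys.setrecursionlimit(10_000)  # sorted input drives the depth to ~n
n = 2_000
for label, data in [("sorted", list(range(n))),
                    ("shuffled", random.sample(range(n), n))]:
    stats = {"comparisons": 0}
    quicksort_count(data, 0, n - 1, stats)
    print(f"{label:>8}: {stats['comparisons']:,} comparisons")
```

On sorted input the count is exactly $n(n-1)/2$ (about 2 million for $n = 2000$), while shuffled input stays near $n \log n$ (tens of thousands), matching the two recurrences.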